13 research outputs found

    On Reachability Analysis of Pushdown Systems with Transductions: Application to Boolean Programs with Call-by-Reference

    Get PDF
    Pushdown systems with transductions (TrPDSs) extend pushdown systems (PDSs) by associating each transition rule with a transduction, which allows the stack content to be inspected and modified at each application of a transition rule. Uezato and Minamide showed that TrPDSs can model PDSs with checkpoints and discrete-timed PDSs. Moreover, TrPDSs can be simulated by PDSs, and the set of predecessor configurations pre^*(C) of a regular set C of configurations can be computed by a saturation procedure when the closure of the transductions in the TrPDS is finite. In this work, we comprehensively investigate the reachability problem of finite TrPDSs. We propose a novel saturation procedure to compute pre^*(C) for finite TrPDSs, and we introduce a saturation procedure to compute the set of successor configurations post^*(C) of a regular set C of configurations. From these two saturation procedures, we derive two efficient implementation algorithms for computing pre^*(C) and post^*(C). Finally, we show how the presence of transductions enables the modeling of Boolean programs with call-by-reference parameter passing. The resulting TrPDS model has a finite closure of transductions, which yields a model-checking approach for Boolean programs with call-by-reference parameter passing against safety properties.
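
    For intuition, the sketch below shows the classical pre^* saturation for ordinary PDSs (without transductions), the procedure that the paper's construction for finite TrPDSs generalizes. The rules and the P-automaton in the example are hypothetical.

```python
# A minimal sketch of the classic pre* saturation for ordinary pushdown systems;
# the TrPDS construction in the paper additionally threads transductions through
# the automaton transitions, which is omitted here.
#
# PDS rules: (p, gamma) -> (p2, w) with len(w) <= 2.
# P-automaton: a set of transitions (state, stack_symbol, state); the control
# locations of the PDS serve as initial states of the automaton.

def pre_star(rules, automaton):
    """Saturate `automaton` (a set of (q, symbol, q') triples) so it accepts pre*(C)."""
    trans = set(automaton)
    changed = True
    while changed:
        changed = False
        for (p, gamma), (p2, w) in rules:
            # Find every automaton state reachable from p2 by reading the word w.
            if len(w) == 0:
                targets = {p2}
            elif len(w) == 1:
                targets = {q for (s, a, q) in trans if s == p2 and a == w[0]}
            else:  # len(w) == 2
                mid = {q for (s, a, q) in trans if s == p2 and a == w[0]}
                targets = {q for (s, a, q) in trans if s in mid and a == w[1]}
            for q in targets:
                t = (p, gamma, q)
                if t not in trans:
                    trans.add(t)
                    changed = True
    return trans

# Hypothetical example: rule (p0, a) -> (p1, "") pops `a`. Saturation adds the
# transition (p0, a, p1), so the configuration <p0, a a> is recognized as a
# predecessor of <p1, a>.
rules = [(("p0", "a"), ("p1", ""))]
automaton = {("p1", "a", "qf")}   # P-automaton for the target set C
print(pre_star(rules, automaton))
```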

    SmartUnit: Empirical Evaluations for Automated Unit Testing of Embedded Software in Industry

    Full text link
    In this paper, we aim at automated coverage-based unit testing for embedded software. To achieve this goal, drawing on an analysis of industrial requirements and on our previous automated unit testing tool CAUT, we built a new tool, SmartUnit, to address the engineering requirements that arise in our partner companies. SmartUnit is a dynamic symbolic execution implementation that supports statement, branch, boundary-value, and MC/DC coverage. It has been used to test more than one million lines of code in real projects. For confidentiality reasons, we select three in-house real projects for the empirical evaluation. We also evaluate SmartUnit on two open-source database projects, SQLite and PostgreSQL, to test the scalability of our tool, since embedded software projects are mostly not large (5K-50K lines of code on average). Our experimental results show that, in general, more than 90% of the functions in the commercial embedded software achieve 100% statement, branch, and MC/DC coverage, more than 80% of the functions in SQLite achieve 100% MC/DC coverage, and more than 60% of the functions in PostgreSQL achieve 100% MC/DC coverage. Moreover, SmartUnit is able to find runtime exceptions at the unit-testing level; we have reported exceptions such as array index out of bounds and division by zero in SQLite. Furthermore, we analyze the reasons for low coverage in automated unit testing in our setting and survey the state of manual unit testing compared with automated unit testing in industry. Comment: In Proceedings of the 40th International Conference on Software Engineering: Software Engineering in Practice Track, Gothenburg, Sweden, May 27-June 3, 2018 (ICSE-SEIP '18), 10 pages
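
    As an illustration of the strongest coverage criterion mentioned above, the sketch below enumerates, for a hypothetical three-condition decision, the pairs of test vectors that MC/DC requires to show that each condition independently affects the outcome; it is not part of SmartUnit itself.

```python
# A small, self-contained illustration of the MC/DC criterion that SmartUnit targets:
# for each condition, there must be a pair of tests that differ only in that condition
# and produce different decision outcomes. The decision below is hypothetical.
from itertools import product

def decision(a, b, c):
    return (a and b) or c   # example decision with three conditions

conds = ["a", "b", "c"]
vectors = list(product([False, True], repeat=3))

def independence_pairs(cond_index):
    """Pairs of test vectors witnessing that the condition independently affects the outcome."""
    pairs = []
    for v in vectors:
        w = list(v)
        w[cond_index] = not w[cond_index]
        w = tuple(w)
        if decision(*v) != decision(*w):
            pairs.append((v, w))
    return pairs

for i, name in enumerate(conds):
    print(name, independence_pairs(i)[:1])   # one witnessing pair per condition suffices
```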

    Prema: A Tool for Precise Requirements Editing, Modeling and Analysis

    Full text link
    We present Prema, a tool for Precise Requirements Editing, Modeling and Analysis. It can be used in various fields to describe precise requirements using formal notations and to perform rigorous analysis. By parsing requirements written in a formal modeling language, Prema obtains a model that aptly depicts the requirements. It also provides various rigorous verification and validation techniques to check whether the requirements meet users' expectations and to find potential errors. We show that our tool provides a unified environment for writing and verifying requirements without resorting to a collection of poorly inter-related tools. For the experimental demonstration, we use the requirements of the automatic train protection (ATP) system of CASCO Signal Co., Ltd., the largest railway signal control system manufacturer in China. The code of the tool cannot be released here because the project is commercially confidential; however, a demonstration video of the tool is available at https://youtu.be/BX0yv8pRMWs. Comment: accepted by the ASE 2019 demonstration track
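
    Prema's concrete modeling notation and checkers are not described in the abstract; the sketch below only illustrates, in generic form, what checking a parsed requirement model against a property can look like, using a hypothetical ATP-style speed-limit requirement and a toy hand-written transition system.

```python
# A generic sketch of the kind of analysis a parsed requirement model enables; both
# the model and the property below are hypothetical and are written by hand here,
# whereas a tool like Prema would derive the model from formal requirement text.

# Toy model: states are (speed, braking) pairs.
transitions = {
    (0, False): [(10, False)],
    (10, False): [(20, False), (10, True)],
    (20, False): [(30, False), (20, True)],
    (30, False): [(30, True)],
    (10, True): [(0, False)],
    (20, True): [(10, True)],
    (30, True): [(20, True)],
}

def violates(state, speed_limit=25):
    speed, braking = state
    # Hypothetical requirement: above the speed limit, braking must be engaged.
    return speed > speed_limit and not braking

def check_invariant(initial):
    """Exhaustively explore reachable states and report any requirement violation."""
    seen, stack = set(), [initial]
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        if violates(s):
            return s
        stack.extend(transitions.get(s, []))
    return None

print(check_invariant((0, False)))   # prints (30, False): the toy model violates the requirement
```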

    FREPA: An Automated and Formal Approach to Requirement Modeling and Analysis in Aircraft Control Domain

    Full text link
    Formal methods are promising for modeling and analyzing system requirements. However, applying formal methods to large-scale industrial projects remains a challenge: industrial engineers lack automated engineering methodologies for effectively constructing precise requirement models and for rigorously validating and verifying (V&V) the generated models. To tackle this challenge, we present a systematic engineering approach, named Formal Requirement Engineering Platform in Aircraft (FREPA), for formal requirement modeling and V&V in the aerospace and aviation control domains. FREPA is the outcome of seamless collaboration between academia and industry over the last eight years. The main contributions of this paper are 1) an automated and systematic engineering approach, FREPA, for constructing requirement models and for validating and verifying systems in the aerospace and aviation control domain, 2) a domain-specific modeling language, AASRDL, for describing formal specifications, and 3) a practical FREPA-based tool, AeroReq, which has been used by our industry partners. We have successfully applied FREPA to seven real aerospace gesture control systems and two aviation engine control systems. The experimental results show that FREPA and the corresponding tool AeroReq significantly facilitate formal modeling and V&V in industry. We also discuss the experience and lessons gained from using FREPA in aerospace and aviation projects. Comment: 12 pages, published at FSE 2020

    A Formal Engineering Approach to Service-based Software Modeling and Integration Testing

    Get PDF
    With the increasing popularity of service-based software in recent years, engineering methods for developing high-quality service-based systems are in high demand; such methods are expected to support the essential engineering activities of system modeling, web service selection, and system testing. However, few systematic methods unify these three essential activities, and the existing technologies supporting them are still not satisfactory. To tackle this challenge, this paper proposes a formal engineering approach that integrates precise system modeling, accurate service selection, and rigorous system integration testing. It includes a unified three-step formal engineering framework for interactive service-based system modeling and the adoption of existing services, a service selection method that combines static matching with specification-based conformance testing, and a formal specification-based integration testing method. We have also developed a prototype tool that supports the proposed engineering framework. An empirical case study and the corresponding experiments show the feasibility of the proposed approach.
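
    The sketch below illustrates the specification-based conformance-testing step of service selection in a minimal, generic form: a candidate service stub is exercised against a pre/post-condition style specification. Both the specification and the candidate are hypothetical; the paper's formal notation is richer than shown here.

```python
# A minimal sketch of specification-based conformance testing during service selection:
# inputs satisfying the precondition are generated at random, and the candidate's
# outputs are checked against the postcondition.
import random

# Hypothetical specification for a "currency conversion" operation.
spec = {
    "pre":  lambda amount, rate: amount >= 0 and rate > 0,
    "post": lambda amount, rate, result: abs(result - amount * rate) < 1e-6,
}

def candidate_service(amount, rate):
    # One candidate web-service implementation (stubbed locally for the sketch).
    return amount * rate

def conforms(service, spec, trials=100):
    """Every generated input satisfying `pre` must yield an output satisfying `post`."""
    for _ in range(trials):
        amount, rate = random.uniform(0, 1000), random.uniform(0.01, 10)
        if spec["pre"](amount, rate) and not spec["post"](amount, rate, service(amount, rate)):
            return False
    return True

print(conforms(candidate_service, spec))   # True: this candidate passes conformance testing
```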

    A Formal Engineering Framework for Service-Based Software Modeling

    No full text

    Evaluation and analysis of stochastic modeling of BeiDou GEO/IGSO/MEO satellite observation

    Get PDF
    First, the importance of the stochastic model in precise positioning is demonstrated from the perspectives of parameter estimation, accuracy evaluation, and quality control. Then, based on the single-difference functional model, a rigorous variance component estimation (VCE) method is used to estimate satellite-specific variances, cross-correlations between two arbitrary frequencies, and the time correlations of phase and code observations on each frequency. The influence of the stochastic model on baseline precision and on the overall statistics is subsequently analyzed. The results show that the observation precision of the BeiDou user receiver is overall elevation-dependent for phase and code on all frequencies, so the elevation-dependent exponential weighting model is recommended. There are different degrees of correlation between the phase observations on the three frequencies, while the cross-correlation between other types of observations is not obvious. The time correlation between phase and code observations on different frequencies is evident and should be taken into account in high-precision positioning. Furthermore, the baseline precisions obtained with the correct stochastic model match the theoretical ones very well for all three baseline components. The paper helps users correctly understand the observations of the three BeiDou satellite types (GEO, IGSO, MEO) and correctly apply the BeiDou system.
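
    The sketch below spells out one common form of the elevation-dependent exponential weighting model recommended above, sigma(E) = a + b * exp(-E / E0), and the corresponding least-squares weight 1/sigma^2. The coefficients are illustrative placeholders, not the variance components estimated in the study.

```python
# Elevation-dependent exponential weighting, a commonly used form of the model the
# paper recommends for BeiDou observations; a, b and e0 below are illustrative only.
import math

def sigma(elev_deg, a=0.003, b=0.003, e0=20.0):
    """Observation standard deviation (metres) as a function of satellite elevation."""
    return a + b * math.exp(-elev_deg / e0)

def weight(elev_deg):
    """Diagonal weight used in least-squares parameter estimation: 1 / sigma^2."""
    return 1.0 / sigma(elev_deg) ** 2

for e in (10, 30, 60, 90):
    print(f"elevation {e:2d} deg: sigma = {sigma(e):.4f} m, weight = {weight(e):.1f}")
```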

    Dodging DeepFake Detection via Implicit Spatial-Domain Notch Filtering

    Full text link
    The high-fidelity generation and high-precision detection of DeepFake images are locked in an arms race. We believe that producing DeepFakes that are highly realistic and "detection evasive" can serve the ultimate goal of improving future DeepFake detection capabilities. In this paper, we propose a simple yet powerful pipeline that reduces the artifact patterns of fake images without hurting image quality by performing implicit spatial-domain notch filtering. We first demonstrate that frequency-domain notch filtering, although famously effective at removing periodic noise in the spatial domain, is infeasible for the task at hand because of the manual design the notch filters require. We therefore resort to a learning-based approach that reproduces the notch-filtering effect, but solely in the spatial domain. Our method, which we name DeepNotch, combines adding overwhelming spatial noise to break the periodic noise pattern with deep image filtering to reconstruct noise-free fake images. Deep image filtering provides a specialized filter for each pixel of the noisy image, producing filtered images with high fidelity compared with their DeepFake counterparts. Moreover, we use the semantic information of the image to generate an adversarial guidance map so that noise is added intelligently. Our large-scale evaluation on three representative state-of-the-art DeepFake detection methods (tested on 16 types of DeepFakes) demonstrates that our technique significantly reduces the accuracy of these three detection methods, by 36.79% on average and up to 97.02% in the best case. Comment: 10 pages
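
    To make the starting point concrete, the sketch below performs classical frequency-domain notch filtering on a synthetic periodic artifact. The hand-picked notch locations illustrate exactly the manual design step that the paper argues does not transfer to real DeepFake artifacts, which is what motivates learning the equivalent effect in the spatial domain.

```python
# Classical frequency-domain notch filtering on a synthetic periodic artifact; the
# notch positions are chosen by hand for this example, which is the manual step that
# DeepNotch avoids by learning an implicit spatial-domain equivalent.
import numpy as np

h, w = 128, 128
clean = np.random.rand(h, w)
x = np.arange(w)
periodic_artifact = 0.3 * np.sin(2 * np.pi * x * 16 / w)   # grid-like artifact, 16 cycles across the width
image = clean + periodic_artifact                          # broadcasts over rows

spectrum = np.fft.fftshift(np.fft.fft2(image))
notched = spectrum.copy()
cy, cx = h // 2, w // 2
for dx in (-16, 16):   # zero a small block around the two symmetric peaks of the artifact
    notched[cy - 1:cy + 2, cx + dx - 1:cx + dx + 2] = 0

restored = np.real(np.fft.ifft2(np.fft.ifftshift(notched)))
print("residual artifact energy:",
      np.linalg.norm(restored - clean) / np.linalg.norm(image - clean))
```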